8 - Recap Clip 5.1: Complexity Analysis in AI (Part 1) [ID:21851]

Another thing that's going to be important is complexity analysis: how long do our algorithms take in general? One of the reasons this is important is that in public you always hear arguments like "the singularity is near, because computing power grows exponentially." That is true as far as it goes. But we will see that all AI algorithms also grow exponentially, if not worse. So the question of whether exponentially growing computing power actually means anything still needs to be answered, and for that we need an idea of how our algorithms perform.

In the years that I have been giving this course, one of the things I noticed is that AI students typically have difficulties, when given an algorithm, seeing what the complexity of that algorithm is. So I decided to go for a little recap of that at the beginning. The first two slides are there to convince you that complexity actually matters: depending on which complexity class we are in, it is possible to solve big problems, or not. If we are somewhere among the polynomial-time algorithms, then we can hope for solutions even for big problems. If we are in the exponential realm, then even for relatively small problems (here I have used a made-up size of one hundred) we are well in the range of "I don't care about the result anymore by the time it arrives." So complexity matters; it really matters for the size of problems we can handle.
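To make this concrete, here is a small back-of-the-envelope sketch of my own (the exact figures on the slide may differ), estimating running times at input size n = 100 on a hypothetical machine that performs 10^9 elementary steps per second:

    import math

    # Estimated running times for different growth rates at n = 100,
    # assuming a (hypothetical) machine doing 1e9 elementary steps per second.
    n = 100
    steps_per_second = 1e9

    growth_rates = {
        "n": n,
        "n log n": n * math.log2(n),
        "n^2": n ** 2,
        "n^3": n ** 3,
        "2^n": 2 ** n,
    }

    for name, steps in growth_rates.items():
        seconds = steps / steps_per_second
        years = seconds / (3600 * 24 * 365)
        print(f"{name:>8}: {seconds:.3e} s  (~{years:.3e} years)")

The polynomial rates all finish in well under a second, while 2^100 steps take on the order of 10^13 years, far longer than the age of the universe.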

That is why we want to keep an eye on the complexity of our algorithms: both the worst-case complexity, i.e. how long this thing might take, which is relatively simple to come to grips with, and the average-case complexity. And very often we can identify simpler cases where the complexity is better.
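For instance (a small sketch of my own, not an example from the lecture), naive quicksort with the first element as pivot takes about n log n comparisons on average, but quadratically many in the worst case of an already sorted input; counting comparisons makes the difference visible:

    import random

    def quicksort_comparisons(xs):
        # Naive quicksort (first element as pivot); returns the number of
        # element-vs-pivot comparisons, counting one per element as an
        # in-place partition would.
        if len(xs) <= 1:
            return 0
        pivot = xs[0]
        rest = xs[1:]
        smaller = [x for x in rest if x < pivot]
        larger = [x for x in rest if x >= pivot]
        return len(rest) + quicksort_comparisons(smaller) + quicksort_comparisons(larger)

    n = 500
    print(quicksort_comparisons(random.sample(range(n), n)))  # average case: roughly n log n
    print(quicksort_comparisons(list(range(n))))              # worst case: roughly n^2 / 2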

I'm assuming that you have seen big-O notation before. We care about these kinds of complexity classes. Quadratic is well tractable; it is well below the growth of computing power, and polynomial algorithms in general still are, so with those we feel reasonably comfortable. Exponential is always a warning sign, and of course, as a little spoiler, most of the things we are going to do will be exponential. So I am assuming you can do big-O calculations. The main idea is that these complexity classes are all contained in each other, and that has a very nice computational consequence: if you have an algorithm with parts that are O(n^3) and other parts that are O(n^2), you can forget about the n^2 parts, because O(n^3) + O(n^2) is just O(n^3). Of course, if you have nested parts, then you have to multiply instead.
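As a small illustration of these two rules (my own sketch, not from the lecture): parts that run one after the other add, so the dominant term wins, while nested parts multiply:

    def part_cubed(n):
        # Three nested loops: the costs multiply, n * n * n, so O(n^3).
        steps = 0
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    steps += 1
        return steps

    def part_squared(n):
        # Two nested loops: n * n, so O(n^2).
        steps = 0
        for i in range(n):
            for j in range(n):
                steps += 1
        return steps

    def algorithm(n):
        # Sequential composition adds: O(n^3) + O(n^2) = O(n^3),
        # so the quadratic part disappears in the big-O bound.
        return part_cubed(n) + part_squared(n)

    print(algorithm(100))  # 1010000 = 100^3 + 100^2, dominated by the n^3 term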

Part of a chapter:
Recaps

Accessible via

Open access

Duration

00:04:44 min

Recording date

2020-10-26

Uploaded on

2020-10-26 12:06:59

Language

en-US

Recap: Complexity Analysis in AI (Part 1)

Main video on the topic in chapter 5 clip 1.
